
    SOME FAMOUS FOLKLORE CLAIMS OF GANDHAMARDAN HILL TRIBALS, NRUSINGHANATH, ODISHA

    The use of medicinal plants is a fundamental component of the Indian traditional healthcare system, the oldest and most widely used of all therapeutic systems. In many parts of India, and specifically in Odisha, this traditional system of healing is the mainstay of healthcare. Although undocumented, this locally proven system holds research potential for the benefit of the human race at a time when contemporary healthcare is reaching its limits. The present article presents a database of the usage of nearly 14 species of medicinal plants found around the Gandhamardhan hills and Nrusinghnath forest areas in Odisha, together with their important folklore claims.

    Secure and Sustainable Load Balancing of Edge Data Centers in Fog Computing

    Fog computing is a recent research trend that brings cloud computing services to network edges. Edge data centers (EDCs) are deployed to decrease latency and network congestion by processing data streams and user requests in near real time. EDC deployment is distributed in nature, positioned between cloud data centers and data sources. Load balancing is the process of redistributing the workload among EDCs to improve both resource utilization and job response time; it also avoids situations where some EDCs are heavily loaded while others sit idle or do little data processing. In such scenarios, load balancing between the EDCs plays a vital role in user response and real-time event detection. Because EDCs are deployed in an unattended environment, secure authentication of EDCs is an important issue to address before performing load balancing. This article proposes a novel load balancing technique to authenticate the EDCs and find less loaded EDCs for task allocation. The proposed technique is more efficient than existing approaches at finding less loaded EDCs for task allocation, and it not only improves the efficiency of load balancing but also strengthens security by authenticating the destination EDCs.
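    The core idea of the abstract, select a task target only among authenticated, lightly loaded EDCs, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual protocol; the `Edc` class and `pick_least_loaded` helper are assumed names introduced here.

```python
# Hypothetical sketch of authenticate-then-balance task placement among
# edge data centers (EDCs). Names and fields are illustrative assumptions.

class Edc:
    def __init__(self, name, load, authenticated):
        self.name = name
        self.load = load                    # current workload, e.g. queued tasks
        self.authenticated = authenticated  # passed secure authentication

def pick_least_loaded(edcs):
    """Return the least-loaded EDC among those that passed authentication."""
    trusted = [e for e in edcs if e.authenticated]
    if not trusted:
        raise RuntimeError("no authenticated EDC available")
    return min(trusted, key=lambda e: e.load)

edcs = [Edc("edc-a", 7, True), Edc("edc-b", 2, False), Edc("edc-c", 3, True)]
target = pick_least_loaded(edcs)  # edc-b is less loaded but unauthenticated
```

    Note that the unauthenticated `edc-b` is skipped even though it carries the lowest load, mirroring the paper's point that authentication must precede load balancing.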

    Quantum Circuits for the Unitary Permutation Problem

    We consider the Unitary Permutation problem which consists, given $n$ unitary gates $U_1, \ldots, U_n$ and a permutation $\sigma$ of $\{1, \ldots, n\}$, in applying the unitary gates in the order specified by $\sigma$, i.e. in performing $U_{\sigma(n)} \ldots U_{\sigma(1)}$. This problem was introduced and investigated by Colnaghi et al., where two models of computation are considered. The first is the (standard) model of query complexity: the complexity measure is the number of calls to any of the unitary gates $U_i$ in a quantum circuit which solves the problem. The second model provides quantum switches and treats unitary transformations as inputs of second order; in that case the complexity measure is the number of quantum switches. In their paper, Colnaghi et al. have shown that the problem can be solved within $n^2$ calls in the query model and $\frac{n(n-1)}{2}$ quantum switches in the new model. We refine these results by proving that $n\log_2(n) + \Theta(n)$ quantum switches are necessary and sufficient to solve this problem, whereas $n^2 - 2n + 4$ calls are sufficient to solve it in the standard quantum circuit model. We prove, with an additional assumption on the family of gates used in the circuits, that $n^2 - o(n^{7/4+\epsilon})$ queries are required, for any $\epsilon > 0$. The upper and lower bounds for the standard quantum circuit model are established by pointing out connections with the permutation as substring problem introduced by Karp.
    Comment: 8 pages, 5 figures
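    The target operation $U_{\sigma(n)} \ldots U_{\sigma(1)}$ can be checked numerically by plain matrix multiplication. This sketch only illustrates the problem definition, not the paper's circuit or switch constructions; `compose_permuted` is a name introduced here and uses 0-indexed gates.

```python
import numpy as np

def compose_permuted(gates, sigma):
    """gates: list of d x d unitaries U_1..U_n (0-indexed here);
    sigma: permutation of range(n) listing which gate to apply at each step.
    Returns the product U_{sigma(n)} @ ... @ U_{sigma(1)}."""
    d = gates[0].shape[0]
    result = np.eye(d, dtype=complex)
    for k in sigma:               # apply U_{sigma(1)} first, U_{sigma(n)} last
        result = gates[k] @ result
    return result

# Example with the 2x2 Pauli gates X and Z; sigma is the identity here,
# so X is applied first and Z last, giving the product Z @ X.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
U = compose_permuted([X, Z], [0, 1])
```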

    Personal Internet of Things (PIoT): what is it exactly

    This is an accepted manuscript of an article published by IEEE in IEEE Consumer Electronics Magazine, available online: https://ieeexplore.ieee.org/document/9431670 The accepted version of the publication may differ from the final published version. Internet of Things (IoT) devices in homes and in the immediate proximity of an individual communicate to create Personal IoT (PIoT) networks. The exploratory study of PIoT is in its infancy; new use cases, service requirements, and the proliferation of PIoT devices remain to be explored. This article provides a big picture of PIoT architecture, vision, and future research scope.

    A Comparison of Machine Learning Methods for Cross-Domain Few-Shot Learning

    We present an empirical evaluation of machine learning algorithms in cross-domain few-shot learning based on a fixed pre-trained feature extractor. Experiments were performed in five target domains (CropDisease, EuroSAT, Food101, ISIC and ChestX) and using two feature extractors: a ResNet10 model trained on a subset of ImageNet known as miniImageNet and a ResNet152 model trained on the ILSVRC 2012 subset of ImageNet. Commonly used machine learning algorithms including logistic regression, support vector machines, random forests, nearest neighbour classification, naïve Bayes, and linear and quadratic discriminant analysis were evaluated on the extracted feature vectors. We also evaluated classification accuracy when subjecting the feature vectors to normalisation using p-norms. Algorithms originally developed for the classification of gene expression data—the nearest shrunken centroid algorithm and LDA ensembles obtained with random projections—were also included in the experiments, in addition to a cosine similarity classifier that has recently proved popular in few-shot learning. The results enable us to identify algorithms, normalisation methods and pre-trained feature extractors that perform well in cross-domain few-shot learning. We show that the cosine similarity classifier and ℓ²-regularised 1-vs-rest logistic regression are generally the best-performing algorithms. We also show that algorithms such as LDA yield consistently higher accuracy when applied to ℓ²-normalised feature vectors. In addition, all classifiers generally perform better when extracting feature vectors using the ResNet152 model instead of the ResNet10 model.
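    A cosine similarity classifier on ℓ²-normalised feature vectors, as evaluated in the abstract, can be sketched as a nearest-centroid rule on the unit sphere. This is a minimal illustration assuming features come from some fixed pre-trained extractor; it is not the authors' experimental code, and the helper names are introduced here.

```python
import numpy as np

def l2_normalise(x, eps=1e-12):
    """Scale each row to unit Euclidean norm (the l2 normalisation step)."""
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

def fit_centroids(features, labels):
    """One l2-normalised class centroid per label from the support set."""
    classes = np.unique(labels)
    cents = np.stack([features[labels == c].mean(axis=0) for c in classes])
    return classes, l2_normalise(cents)

def predict(features, classes, centroids):
    """Assign each query to the class whose centroid has highest cosine similarity."""
    sims = l2_normalise(features) @ centroids.T  # cosine similarity matrix
    return classes[np.argmax(sims, axis=1)]

# Toy support/query split with 4-dimensional feature vectors.
support = np.array([[1.0, 0.0, 0.0, 0.0], [0.9, 0.1, 0.0, 0.0],
                    [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.8, 0.2]])
labels = np.array([0, 0, 1, 1])
classes, cents = fit_centroids(support, labels)
pred = predict(np.array([[0.95, 0.05, 0.0, 0.0]]), classes, cents)
```

    Because both queries and centroids are ℓ²-normalised, the dot product is exactly the cosine similarity, which is why normalisation and this classifier interact so directly.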

    Ethics, Nanobiosensors and Elite Sport: The Need for a New Governance Framework

    Individual athletes, coaches and sports teams continuously seek ways to improve performance and accomplishment in elite competition. New techniques of performance analysis are a crucial part of the drive for athletic perfection. This paper discusses the ethical importance of one aspect of the future potential of performance analysis in sport, combining the fields of biomedicine, sports engineering and nanotechnology in the form of ‘Nanobiosensors’. This innovative technology has the potential to revolutionise sport, enabling real-time biological data to be collected from athletes and distributed electronically. Enabling precise real-time performance analysis is not without ethical problems. Arguments concerning (1) data ownership and privacy; (2) data confidentiality; and (3) athlete welfare are presented alongside a discussion of the use of the Precautionary Principle in making ethical evaluations. We conclude that although the future use of Nanobiosensors in sports analysis offers many potential benefits, there is also a fear that it could be abused at a sporting-system level. Hence, it is essential for sporting bodies to consider the development of a robust, ethically informed governance framework in advance of their widespread use.

    Azimuthal Anisotropy of Photon and Charged Particle Emission in Pb+Pb Collisions at 158 A GeV/c

    The azimuthal distributions of photons and charged particles with respect to the event plane are investigated as a function of centrality in Pb+Pb collisions at 158 A GeV/c in the WA98 experiment at the CERN SPS. The anisotropy of the azimuthal distributions is characterized using a Fourier analysis. For both the photon and charged particle distributions, the first two Fourier coefficients are observed to decrease with increasing centrality. The observed anisotropies of the photon distributions compare well with the expectations from the charged particle measurements for all centralities.
    Comment: 8 pages and 6 figures. The manuscript has undergone a major revision. The unwanted correlations were enhanced in the random subdivision method used in the earlier version. The present version uses the more established method of division into subevents separated in rapidity to minimise short-range correlations. The observed results for charged particles are in agreement with results from other experiments. The observed anisotropy in photons is explained using the flow results of pions and the correlations arising from the decay of the neutral pion.
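    The Fourier characterisation of azimuthal anisotropy is conventionally written as $v_n = \langle \cos(n(\phi - \Psi)) \rangle$, with $\phi$ the particle azimuth and $\Psi$ the event-plane angle. The sketch below illustrates this standard definition on toy data; it is not the WA98 analysis code, and the function name is an assumption.

```python
import numpy as np

def fourier_coefficients(phi, psi, n_max=2):
    """Return v_1..v_n_max, where v_n = <cos(n * (phi - psi))> over particles,
    for azimuthal angles phi (radians) and event-plane angle psi."""
    dphi = np.asarray(phi) - psi
    return np.array([np.mean(np.cos(n * dphi)) for n in range(1, n_max + 1)])

# Toy event sample with a built-in cos(2*phi) modulation (elliptic anisotropy):
# accept angles with probability proportional to 1 + 0.2*cos(2*phi).
rng = np.random.default_rng(0)
phi = rng.uniform(-np.pi, np.pi, 200000)
keep = rng.uniform(0.0, 1.0, phi.size) < (1 + 0.2 * np.cos(2 * phi)) / 1.2
v1, v2 = fourier_coefficients(phi[keep], psi=0.0)  # v1 near 0, v2 near 0.1
```

    For this toy distribution, $v_2$ recovers half the modulation amplitude (0.1) while $v_1$ vanishes, matching the analytic expectation for the accept-reject weight used.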

    Multiplicity Distributions and Charged-neutral Fluctuations

    Results from the multiplicity distributions of inclusive photons and charged particles, scaling of particle multiplicities, event-by-event multiplicity fluctuations, and charged-neutral fluctuations in 158$A$ GeV Pb+Pb collisions are presented and discussed. A scaling of charged particle multiplicity as $N_{part}^{1.07\pm 0.05}$ and of photons as $N_{part}^{1.12\pm 0.03}$ has been observed, indicating a violation of the naive wounded nucleon model. The analysis of localized charged-neutral fluctuations provides a model-independent demonstration of non-statistical fluctuations in both charged particles and photons in limited azimuthal regions. However, no correlated charged-neutral fluctuations are observed.
    Comment: Talk given at the International Symposium on Nuclear Physics (ISNP-2000), Mumbai, India, 18-22 Dec 2000; Proceedings to be published in Pramana, Journal of Physics

    Production of phi mesons at mid-rapidity in sqrt(s_NN) = 200 GeV Au+Au collisions at RHIC

    We present the first results of phi meson production in the K^+K^- decay channel from Au+Au collisions at sqrt(s_NN) = 200 GeV as measured at mid-rapidity by the PHENIX detector at RHIC. Precision resonance centroid and width values are extracted as a function of collision centrality. No significant variation from the PDG accepted values is observed. The transverse mass spectra are fitted with a linear exponential function, for which the derived inverse slope parameter is seen to be constant as a function of centrality. These data are also fitted by a hydrodynamic model, with the result that the freeze-out temperature and expansion velocity values are consistent with the values previously derived from fitting single-hadron inclusive data. As a function of transverse momentum, the peripheral-to-central scaled yield ratio R_CP for the phi is comparable to that of pions rather than that of protons. This result lends support to theoretical models which distinguish between baryons and mesons rather than particle mass in explaining the anomalous proton yield.
    Comment: 326 authors, 24 pages text, 23 figures, 6 tables, RevTeX 4. To be submitted to Physical Review C as a regular article. Plain text data tables for the points plotted in figures for this and previous PHENIX publications are (or will be) publicly available at http://www.phenix.bnl.gov/papers.htm
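    Extracting an inverse slope parameter T from a transverse-mass spectrum, as described in the abstract, amounts to fitting dN/dm_T ∝ exp(−m_T/T). A minimal sketch, using a linear fit in log space on a toy spectrum rather than the PHENIX fitting procedure:

```python
import numpy as np

def inverse_slope(m_t, yields):
    """Fit log(yield) vs m_T with a straight line; the slope is -1/T,
    so T (the inverse slope parameter) is -1/slope."""
    slope, _ = np.polyfit(m_t, np.log(yields), 1)
    return -1.0 / slope

# Toy spectrum generated with an inverse slope of T = 0.35 GeV.
m_t = np.linspace(1.0, 3.0, 20)       # transverse mass points, GeV
y = np.exp(-m_t / 0.35)               # idealised exponential yields
T = inverse_slope(m_t, y)             # recovers ~0.35 GeV
```

    On real spectra one would fit the measured yields with uncertainties rather than a noiseless exponential, but the inverse-slope definition is the same.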

    Extreme sensitivity in Snowball Earth formation to mountains on PaleoProterozoic supercontinents

    During the PaleoProterozoic, 2.45 to 2.2 billion years ago, several glaciations may have produced Snowball Earths. These glacial cycles occurred during a period of large environmental change, when atmospheric oxygen was increasing, a supercontinent was assembled from numerous landmasses, and collisions between these landmasses formed mountain ranges. Despite uncertainties in the composition of the atmosphere and the reconstruction of the landmasses, paleoclimate model simulations can test the sensitivity of the climate to producing a Snowball Earth. Here we present a series of simulations that vary the atmospheric methane concentration and the latitudes of west–east-oriented mountain ranges on an idealised supercontinent. For a given methane concentration, the latitudes of mountains control whether a Snowball Earth forms or not. Significantly, mountains in middle latitudes inhibited Snowball Earth formation, and mountains in low latitudes promoted it, with the supercontinent with mountains at ±30° being most conducive to forming a Snowball Earth because of reduced albedo at low latitudes. We propose that the extreme sensitivity of a Snowball Earth to reconstructions of the paleogeography and paleoatmospheric composition may explain the observed glaciations, demonstrating the importance of high-quality reconstructions for improving understanding of this early period in Earth’s history.